Schmidt-Hieber J - Nonparametric regression using deep neural networks with ReLU activation function
URL: https://arxiv.org/abs/1708.06633
Aim
- The main theorem states that a sparse ReLU network estimator is minimax rate optimal (up to log factors) if and only if it nearly minimizes the empirical risk
New Terms
- Minimax Estimator: an estimator δ_M that performs best in the worst case allowed by the problem, i.e. it minimizes the maximal risk over the function class.
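The definition above can be written out formally. This is the standard textbook formulation (not taken verbatim from the paper), with F denoting the function class and the risk measured in prediction loss:

```latex
% Minimax risk: the best worst-case risk over the class F
R^* \;=\; \inf_{\hat f}\; \sup_{f \in \mathcal{F}}\; \mathbb{E}_f\big[\lVert \hat f - f \rVert^2\big]

% An estimator \delta_M is minimax if it attains this infimum:
\sup_{f \in \mathcal{F}} \mathbb{E}_f\big[\lVert \delta_M - f \rVert^2\big] \;=\; R^*
```

"Minimax rate optimal (up to log factors)" then means the estimator's worst-case risk matches R* up to constants and powers of log n.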
- TODO Wavelet Series Estimators:
Key Results
- Optimal estimation rates for multilayer neural networks with ReLU activation.
- Wavelet estimators can only achieve suboptimal rates under the composition assumption
Limitations
- Results are limited to ReLU-activated multilayer feedforward neural networks
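For concreteness, the network class the results cover can be sketched as a plain feedforward forward pass. This is a hypothetical minimal illustration of the architecture (the paper's estimator additionally imposes sparsity and bounded weights, which are omitted here); the function and variable names are my own:

```python
import numpy as np

def relu(x):
    """ReLU activation, applied componentwise."""
    return np.maximum(x, 0.0)

def relu_network(x, weights, biases):
    """Forward pass of a multilayer feedforward ReLU network.

    weights: list of matrices W_1, ..., W_L; biases: list of vectors v_1, ..., v_L.
    Hidden layers apply ReLU; the final layer is linear, matching the
    regression setting. Minimal sketch, not the paper's sparse estimator.
    """
    h = x
    for W, v in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + v)
    return weights[-1] @ h + biases[-1]

# Example: depth 2, hidden width 3, mapping R^2 -> R.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [rng.standard_normal(3), rng.standard_normal(1)]
y = relu_network(np.array([0.5, -1.0]), weights, biases)
```

The depth, widths, and (in the paper) the number of nonzero parameters are the architecture choices that must scale with the sample size to attain the stated rates.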